Today she will give a talk on physics-inspired equivariant machine learning.
Please, Soledad, you can start.
Thank you.
Thank you so much for the invitation and for the introduction.
I'm going to talk about physics-inspired equivariant machine learning.
And please ask questions during the talk if you have any.
I'm very happy to answer them and to discuss whatever you want.
So this talk is in the context of symmetries in deep learning.
So how can one exploit symmetries in the design of machine learning models?
And what can symmetries give us?
Symmetries have been around for a long time:
they have long been exploited in the signal processing community,
not only by explicitly imposing them, but also implicitly.
So for instance, convolutional neural networks
are said to exploit the translation symmetry
of natural images by applying the same convolutional filters
at different locations of the image.
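As a quick numerical sketch of this weight-sharing idea (my own illustration, not code from the talk), one can check that a 1-D circular convolution commutes with shifts: convolving a shifted signal equals shifting the convolved signal.

```python
# A minimal sketch (not from the talk) checking translation equivariance of a
# 1-D circular convolution: shift-then-convolve equals convolve-then-shift.
import numpy as np

def circular_conv(x, w):
    """Circular convolution: the same filter w is applied at every position."""
    n = len(x)
    return np.array([sum(w[k] * x[(i - k) % n] for k in range(len(w)))
                     for i in range(n)])

rng = np.random.default_rng(0)
x = rng.normal(size=16)      # toy 1-D "image"
w = rng.normal(size=3)       # shared convolutional filter
shift = 5

lhs = circular_conv(np.roll(x, shift), w)   # transform input, then convolve
rhs = np.roll(circular_conv(x, w), shift)   # convolve, then transform output
assert np.allclose(lhs, rhs)                # equivariance holds exactly
```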
And recurrent neural networks also exploit
a sort of translation symmetry in the way
they are designed, applying the same recurrent unit
at different positions of the sequence.
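A minimal sketch of that recurrent weight sharing (again my own illustration, not from the talk): a single cell with one set of parameters is reused at every time step.

```python
# A minimal sketch of weight sharing in a recurrent network: one cell, with
# one set of parameters (W_h, W_x), is applied at every position.
import numpy as np

def rnn(xs, W_h, W_x, h):
    """Apply the same recurrent unit at every time step."""
    states = []
    for x in xs:                          # same parameters reused at each step
        h = np.tanh(W_h @ h + W_x @ x)
        states.append(h)
    return states

rng = np.random.default_rng(0)
d_h, d_x, T = 3, 2, 5
W_h = rng.normal(size=(d_h, d_h))
W_x = rng.normal(size=(d_h, d_x))
xs = rng.normal(size=(T, d_x))
states = rnn(xs, W_h, W_x, np.zeros(d_h))  # one cell, reused T times
```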
You can say similar things for attention networks and transformers.
And also graph neural networks are
designed to satisfy certain symmetries that
are natural for graphs. In particular,
if you write a graph like this one as an adjacency matrix,
the representation is not unique.
So there's a symmetry group that acts on the space
of the adjacency matrices preserving the graph.
And one would want to design classes of functions,
graph neural networks, that are compatible with the symmetries
that are natural for that space.
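As a hedged toy example (mine, not the speaker's code), one can check that a simple message-passing layer is compatible with this symmetry: relabeling the nodes of the input graph relabels the output node features in the same way, f(P A Pᵀ, P X) = P f(A, X).

```python
# A minimal sketch: relabeling the nodes conjugates the adjacency matrix by a
# permutation matrix P, and a simple message-passing layer commutes with it.
import numpy as np

def gnn_layer(A, X, W):
    """One message-passing step: aggregate neighbors, then a shared linear map."""
    return np.tanh((A + np.eye(len(A))) @ X @ W)

rng = np.random.default_rng(0)
n, d = 5, 4
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T           # symmetric adjacency matrix
X = rng.normal(size=(n, d))              # node features
W = rng.normal(size=(d, d))              # shared weights

P = np.eye(n)[rng.permutation(n)]        # random permutation matrix

lhs = gnn_layer(P @ A @ P.T, P @ X, W)   # relabel the graph, then apply layer
rhs = P @ gnn_layer(A, X, W)             # apply layer, then relabel output
assert np.allclose(lhs, rhs)             # permutation equivariance
```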
And so what I'm going to talk about today
has to do with how one can use symmetries
and how these symmetries can give you
a better inductive bias in physics problems.
But this is applicable to other problems as well.
So I'm going to talk about invariant and equivariant
machine learning.
This is a topic that has been around for a few years.
And basically what I mean by invariant and equivariant
is this: say G is a group acting on your data space.
A function that takes an input and gives an output
is invariant if, when you apply the group action to the input,
the output doesn't change; in symbols, f(g·x) = f(x) for every g in G.
So here I have this flower.
And then I apply a rotation or a reflection
with respect to some axis.
And the classifier takes the image as input and outputs
the same label, whether or not the transformation has been applied.
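One standard way to obtain such an invariant classifier, shown here as a toy sketch rather than the construction from the talk, is group averaging: average any base function over the group of rotations, and the result is exactly invariant.

```python
# A minimal sketch of an invariant classifier via group averaging: averaging
# any base network over the four 90-degree rotations of the image makes the
# output exactly invariant to those rotations.
import numpy as np

def base_score(img):
    """An arbitrary (non-invariant) score standing in for a network."""
    return float(np.sum(img * np.arange(img.size).reshape(img.shape)))

def invariant_score(img):
    """Average over the rotation group C4, so f(g.x) = f(x) for all g."""
    return np.mean([base_score(np.rot90(img, k)) for k in range(4)])

rng = np.random.default_rng(0)
img = rng.normal(size=(8, 8))
assert np.isclose(invariant_score(img), invariant_score(np.rot90(img)))
```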